[ Wed Sep 28 02:22:09 2022 ] using warm up, epoch: 5
[ Wed Sep 28 02:22:23 2022 ] Parameters:
{'work_dir': 'work_dir/ntu60/cview/fc_bone', 'model_saved_name': 'work_dir/ntu60/cview/fc_bone/runs', 'config': 'config/nturgbd-cross-view/fc_bone.yaml', 'phase': 'train', 'save_score': False, 'joint_label': [], 'seed': 1, 'log_interval': 100, 'save_interval': 1, 'save_epoch': 35, 'eval_interval': 5, 'ema': False, 'print_log': True, 'show_topk': [1, 5], 'feeder': 'feeders.feeder_ntu.Feeder', 'num_worker': 48, 'train_feeder_args': {'data_path': 'data/ntu60/NTU60_CV.npz', 'split': 'train', 'debug': False, 'random_choose': False, 'random_shift': False, 'random_move': False, 'window_size': 64, 'normalization': False, 'random_rot': True, 'p_interval': [0.5, 1], 'vel': False, 'bone': True}, 'test_feeder_args': {'data_path': 'data/ntu60/NTU60_CV.npz', 'split': 'test', 'window_size': 64, 'p_interval': [0.95], 'vel': False, 'bone': True, 'debug': False}, 'model': 'model.FC-Chains_L_multi_head_new_12_layers.Model', 'model_args': {'num_class': 60, 'num_point': 25, 'num_person': 2}, 'weights': None, 'ignore_weights': [], 'base_lr': 0.1, 'step': [90, 100], 'device': [5], 'optimizer': 'SGD', 'nesterov': True, 'momentum': 0.9, 'batch_size': 64, 'test_batch_size': 64, 'start_epoch': 0, 'num_epoch': 110, 'weight_decay': 0.0004, 'lr_decay_rate': 0.1, 'warm_up_epoch': 5}

[ Wed Sep 28 02:22:23 2022 ] # Parameters: 2082097
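The hyperparameters above imply a learning-rate schedule: 5 warm-up epochs, then step decay at epochs 90 and 100 with `base_lr: 0.1` and `lr_decay_rate: 0.1`. A minimal sketch of that schedule is below; the exact warm-up rule used by the training code is an assumption (linear warm-up, as in common ST-GCN-style repositories), not taken from this log.

```python
def learning_rate(epoch, base_lr=0.1, warm_up_epoch=5,
                  steps=(90, 100), decay_rate=0.1):
    """Return the LR for a 1-indexed epoch under warm-up + step decay.

    Defaults mirror the logged config: base_lr=0.1, warm_up_epoch=5,
    step=[90, 100], lr_decay_rate=0.1.
    """
    if epoch <= warm_up_epoch:
        # assumed linear warm-up from base_lr/warm_up_epoch up to base_lr
        return base_lr * epoch / warm_up_epoch
    # step decay: multiply by decay_rate for each milestone already passed
    passed = sum(1 for s in steps if epoch > s)
    return base_lr * (decay_rate ** passed)
```

Under this sketch the LR ramps to 0.1 by epoch 5, drops to 0.01 after epoch 90, and to 0.001 after epoch 100, matching `num_epoch: 110`.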
[ Wed Sep 28 02:22:23 2022 ] Training epoch: 1
[ Wed Sep 28 02:25:25 2022 ] 	Mean training loss: 2.8891. loss2: 0.0000. Mean training acc: 25.11%.
[ Wed Sep 28 02:25:25 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 02:25:25 2022 ] Eval epoch: 1
[ Wed Sep 28 02:25:59 2022 ] 	Mean test loss of 296 batches: 1.9107323530557994.
[ Wed Sep 28 02:25:59 2022 ] 	Top1: 42.44%
[ Wed Sep 28 02:25:59 2022 ] 	Top5: 82.44%
[ Wed Sep 28 02:25:59 2022 ] Training epoch: 2
[ Wed Sep 28 02:28:57 2022 ] 	Mean training loss: 1.7381. loss2: 0.0000. Mean training acc: 48.59%.
[ Wed Sep 28 02:28:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:28:57 2022 ] Eval epoch: 2
[ Wed Sep 28 02:29:30 2022 ] 	Mean test loss of 296 batches: 1.3570017679720312.
[ Wed Sep 28 02:29:30 2022 ] 	Top1: 58.83%
[ Wed Sep 28 02:29:30 2022 ] 	Top5: 90.81%
[ Wed Sep 28 02:29:30 2022 ] Training epoch: 3
[ Wed Sep 28 02:32:28 2022 ] 	Mean training loss: 1.3388. loss2: 0.0000. Mean training acc: 59.53%.
[ Wed Sep 28 02:32:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:32:28 2022 ] Eval epoch: 3
[ Wed Sep 28 02:33:01 2022 ] 	Mean test loss of 296 batches: 0.9221678905793138.
[ Wed Sep 28 02:33:01 2022 ] 	Top1: 72.04%
[ Wed Sep 28 02:33:01 2022 ] 	Top5: 95.83%
[ Wed Sep 28 02:33:01 2022 ] Training epoch: 4
[ Wed Sep 28 02:35:59 2022 ] 	Mean training loss: 1.1469. loss2: 0.0000. Mean training acc: 65.17%.
[ Wed Sep 28 02:35:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:35:59 2022 ] Eval epoch: 4
[ Wed Sep 28 02:36:33 2022 ] 	Mean test loss of 296 batches: 0.8706984406186117.
[ Wed Sep 28 02:36:33 2022 ] 	Top1: 72.63%
[ Wed Sep 28 02:36:33 2022 ] 	Top5: 95.97%
[ Wed Sep 28 02:36:33 2022 ] Training epoch: 5
[ Wed Sep 28 02:39:31 2022 ] 	Mean training loss: 1.0271. loss2: 0.0000. Mean training acc: 68.49%.
[ Wed Sep 28 02:39:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:39:31 2022 ] Eval epoch: 5
[ Wed Sep 28 02:40:04 2022 ] 	Mean test loss of 296 batches: 0.9513041108242564.
[ Wed Sep 28 02:40:04 2022 ] 	Top1: 70.77%
[ Wed Sep 28 02:40:04 2022 ] 	Top5: 95.11%
[ Wed Sep 28 02:40:04 2022 ] Training epoch: 6
[ Wed Sep 28 02:43:03 2022 ] 	Mean training loss: 0.9150. loss2: 0.0000. Mean training acc: 71.87%.
[ Wed Sep 28 02:43:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:43:03 2022 ] Eval epoch: 6
[ Wed Sep 28 02:43:36 2022 ] 	Mean test loss of 296 batches: 0.8855591768751273.
[ Wed Sep 28 02:43:36 2022 ] 	Top1: 73.75%
[ Wed Sep 28 02:43:36 2022 ] 	Top5: 95.06%
[ Wed Sep 28 02:43:36 2022 ] Training epoch: 7
[ Wed Sep 28 02:46:34 2022 ] 	Mean training loss: 0.8310. loss2: 0.0000. Mean training acc: 74.16%.
[ Wed Sep 28 02:46:34 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:46:34 2022 ] Eval epoch: 7
[ Wed Sep 28 02:47:08 2022 ] 	Mean test loss of 296 batches: 0.687693689219855.
[ Wed Sep 28 02:47:08 2022 ] 	Top1: 78.51%
[ Wed Sep 28 02:47:08 2022 ] 	Top5: 97.32%
[ Wed Sep 28 02:47:08 2022 ] Training epoch: 8
[ Wed Sep 28 02:50:06 2022 ] 	Mean training loss: 0.8007. loss2: 0.0000. Mean training acc: 75.20%.
[ Wed Sep 28 02:50:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:50:06 2022 ] Eval epoch: 8
[ Wed Sep 28 02:50:39 2022 ] 	Mean test loss of 296 batches: 0.9084460349703157.
[ Wed Sep 28 02:50:39 2022 ] 	Top1: 73.04%
[ Wed Sep 28 02:50:39 2022 ] 	Top5: 95.31%
[ Wed Sep 28 02:50:39 2022 ] Training epoch: 9
[ Wed Sep 28 02:53:38 2022 ] 	Mean training loss: 0.7534. loss2: 0.0000. Mean training acc: 76.49%.
[ Wed Sep 28 02:53:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:53:38 2022 ] Eval epoch: 9
[ Wed Sep 28 02:54:11 2022 ] 	Mean test loss of 296 batches: 0.6891071257760396.
[ Wed Sep 28 02:54:11 2022 ] 	Top1: 77.91%
[ Wed Sep 28 02:54:11 2022 ] 	Top5: 97.42%
[ Wed Sep 28 02:54:11 2022 ] Training epoch: 10
[ Wed Sep 28 02:57:09 2022 ] 	Mean training loss: 0.7285. loss2: 0.0000. Mean training acc: 77.37%.
[ Wed Sep 28 02:57:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:57:09 2022 ] Eval epoch: 10
[ Wed Sep 28 02:57:42 2022 ] 	Mean test loss of 296 batches: 0.584242037875024.
[ Wed Sep 28 02:57:42 2022 ] 	Top1: 81.19%
[ Wed Sep 28 02:57:43 2022 ] 	Top5: 97.61%
[ Wed Sep 28 02:57:43 2022 ] Training epoch: 11
[ Wed Sep 28 03:00:41 2022 ] 	Mean training loss: 0.7013. loss2: 0.0000. Mean training acc: 78.21%.
[ Wed Sep 28 03:00:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:00:41 2022 ] Eval epoch: 11
[ Wed Sep 28 03:01:14 2022 ] 	Mean test loss of 296 batches: 0.7324375372279335.
[ Wed Sep 28 03:01:14 2022 ] 	Top1: 76.37%
[ Wed Sep 28 03:01:14 2022 ] 	Top5: 97.02%
[ Wed Sep 28 03:01:14 2022 ] Training epoch: 12
[ Wed Sep 28 03:04:14 2022 ] 	Mean training loss: 0.6763. loss2: 0.0000. Mean training acc: 78.59%.
[ Wed Sep 28 03:04:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:04:14 2022 ] Eval epoch: 12
[ Wed Sep 28 03:04:47 2022 ] 	Mean test loss of 296 batches: 0.6309206029651938.
[ Wed Sep 28 03:04:47 2022 ] 	Top1: 79.41%
[ Wed Sep 28 03:04:47 2022 ] 	Top5: 97.42%
[ Wed Sep 28 03:04:47 2022 ] Training epoch: 13
[ Wed Sep 28 03:07:45 2022 ] 	Mean training loss: 0.6717. loss2: 0.0000. Mean training acc: 78.98%.
[ Wed Sep 28 03:07:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:07:45 2022 ] Eval epoch: 13
[ Wed Sep 28 03:08:19 2022 ] 	Mean test loss of 296 batches: 0.519779132427396.
[ Wed Sep 28 03:08:19 2022 ] 	Top1: 83.59%
[ Wed Sep 28 03:08:19 2022 ] 	Top5: 97.86%
[ Wed Sep 28 03:08:19 2022 ] Training epoch: 14
[ Wed Sep 28 03:11:17 2022 ] 	Mean training loss: 0.6529. loss2: 0.0000. Mean training acc: 79.51%.
[ Wed Sep 28 03:11:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:11:17 2022 ] Eval epoch: 14
[ Wed Sep 28 03:11:50 2022 ] 	Mean test loss of 296 batches: 0.6583886011629492.
[ Wed Sep 28 03:11:50 2022 ] 	Top1: 79.86%
[ Wed Sep 28 03:11:50 2022 ] 	Top5: 97.05%
[ Wed Sep 28 03:11:50 2022 ] Training epoch: 15
[ Wed Sep 28 03:14:49 2022 ] 	Mean training loss: 0.6355. loss2: 0.0000. Mean training acc: 80.03%.
[ Wed Sep 28 03:14:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:14:49 2022 ] Eval epoch: 15
[ Wed Sep 28 03:15:22 2022 ] 	Mean test loss of 296 batches: 0.6207979265097026.
[ Wed Sep 28 03:15:22 2022 ] 	Top1: 80.67%
[ Wed Sep 28 03:15:22 2022 ] 	Top5: 97.58%
[ Wed Sep 28 03:15:22 2022 ] Training epoch: 16
[ Wed Sep 28 03:18:20 2022 ] 	Mean training loss: 0.6324. loss2: 0.0000. Mean training acc: 80.33%.
[ Wed Sep 28 03:18:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:18:20 2022 ] Eval epoch: 16
[ Wed Sep 28 03:18:54 2022 ] 	Mean test loss of 296 batches: 0.5463380945494046.
[ Wed Sep 28 03:18:54 2022 ] 	Top1: 82.54%
[ Wed Sep 28 03:18:54 2022 ] 	Top5: 97.52%
[ Wed Sep 28 03:18:54 2022 ] Training epoch: 17
[ Wed Sep 28 03:21:53 2022 ] 	Mean training loss: 0.6115. loss2: 0.0000. Mean training acc: 81.07%.
[ Wed Sep 28 03:21:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:21:53 2022 ] Eval epoch: 17
[ Wed Sep 28 03:22:26 2022 ] 	Mean test loss of 296 batches: 0.6196232996377591.
[ Wed Sep 28 03:22:26 2022 ] 	Top1: 80.38%
[ Wed Sep 28 03:22:26 2022 ] 	Top5: 97.61%
[ Wed Sep 28 03:22:26 2022 ] Training epoch: 18
[ Wed Sep 28 03:25:24 2022 ] 	Mean training loss: 0.6076. loss2: 0.0000. Mean training acc: 81.04%.
[ Wed Sep 28 03:25:24 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:25:24 2022 ] Eval epoch: 18
[ Wed Sep 28 03:25:58 2022 ] 	Mean test loss of 296 batches: 0.5440996539955204.
[ Wed Sep 28 03:25:58 2022 ] 	Top1: 83.18%
[ Wed Sep 28 03:25:58 2022 ] 	Top5: 97.74%
[ Wed Sep 28 03:25:58 2022 ] Training epoch: 19
[ Wed Sep 28 03:28:57 2022 ] 	Mean training loss: 0.5890. loss2: 0.0000. Mean training acc: 81.58%.
[ Wed Sep 28 03:28:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:28:57 2022 ] Eval epoch: 19
[ Wed Sep 28 03:29:30 2022 ] 	Mean test loss of 296 batches: 0.4895806769664223.
[ Wed Sep 28 03:29:30 2022 ] 	Top1: 83.93%
[ Wed Sep 28 03:29:30 2022 ] 	Top5: 98.51%
[ Wed Sep 28 03:29:30 2022 ] Training epoch: 20
[ Wed Sep 28 03:32:28 2022 ] 	Mean training loss: 0.5810. loss2: 0.0000. Mean training acc: 81.69%.
[ Wed Sep 28 03:32:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:32:28 2022 ] Eval epoch: 20
[ Wed Sep 28 03:33:02 2022 ] 	Mean test loss of 296 batches: 0.6234328976354083.
[ Wed Sep 28 03:33:02 2022 ] 	Top1: 81.23%
[ Wed Sep 28 03:33:02 2022 ] 	Top5: 97.50%
[ Wed Sep 28 03:33:02 2022 ] Training epoch: 21
[ Wed Sep 28 03:36:00 2022 ] 	Mean training loss: 0.5657. loss2: 0.0000. Mean training acc: 82.33%.
[ Wed Sep 28 03:36:00 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:36:00 2022 ] Eval epoch: 21
[ Wed Sep 28 03:36:33 2022 ] 	Mean test loss of 296 batches: 0.46414328864901455.
[ Wed Sep 28 03:36:33 2022 ] 	Top1: 85.05%
[ Wed Sep 28 03:36:34 2022 ] 	Top5: 98.40%
[ Wed Sep 28 03:36:34 2022 ] Training epoch: 22
[ Wed Sep 28 03:39:32 2022 ] 	Mean training loss: 0.5723. loss2: 0.0000. Mean training acc: 82.16%.
[ Wed Sep 28 03:39:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:39:32 2022 ] Eval epoch: 22
[ Wed Sep 28 03:40:05 2022 ] 	Mean test loss of 296 batches: 0.49880341526020217.
[ Wed Sep 28 03:40:05 2022 ] 	Top1: 83.99%
[ Wed Sep 28 03:40:05 2022 ] 	Top5: 98.10%
[ Wed Sep 28 03:40:05 2022 ] Training epoch: 23
[ Wed Sep 28 03:43:03 2022 ] 	Mean training loss: 0.5573. loss2: 0.0000. Mean training acc: 82.51%.
[ Wed Sep 28 03:43:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:43:03 2022 ] Eval epoch: 23
[ Wed Sep 28 03:43:37 2022 ] 	Mean test loss of 296 batches: 0.5234723512989443.
[ Wed Sep 28 03:43:37 2022 ] 	Top1: 83.40%
[ Wed Sep 28 03:43:37 2022 ] 	Top5: 98.32%
[ Wed Sep 28 03:43:37 2022 ] Training epoch: 24
[ Wed Sep 28 03:46:35 2022 ] 	Mean training loss: 0.5636. loss2: 0.0000. Mean training acc: 82.20%.
[ Wed Sep 28 03:46:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:46:35 2022 ] Eval epoch: 24
[ Wed Sep 28 03:47:08 2022 ] 	Mean test loss of 296 batches: 0.4434270388320894.
[ Wed Sep 28 03:47:08 2022 ] 	Top1: 86.06%
[ Wed Sep 28 03:47:08 2022 ] 	Top5: 98.17%
[ Wed Sep 28 03:47:08 2022 ] Training epoch: 25
[ Wed Sep 28 03:50:07 2022 ] 	Mean training loss: 0.5535. loss2: 0.0000. Mean training acc: 82.68%.
[ Wed Sep 28 03:50:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:50:07 2022 ] Eval epoch: 25
[ Wed Sep 28 03:50:40 2022 ] 	Mean test loss of 296 batches: 0.47018757608492634.
[ Wed Sep 28 03:50:41 2022 ] 	Top1: 85.50%
[ Wed Sep 28 03:50:41 2022 ] 	Top5: 98.00%
[ Wed Sep 28 03:50:41 2022 ] Training epoch: 26
[ Wed Sep 28 03:53:40 2022 ] 	Mean training loss: 0.5450. loss2: 0.0000. Mean training acc: 82.74%.
[ Wed Sep 28 03:53:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:53:40 2022 ] Eval epoch: 26
[ Wed Sep 28 03:54:13 2022 ] 	Mean test loss of 296 batches: 0.6214263481465546.
[ Wed Sep 28 03:54:13 2022 ] 	Top1: 81.58%
[ Wed Sep 28 03:54:13 2022 ] 	Top5: 96.58%
[ Wed Sep 28 03:54:13 2022 ] Training epoch: 27
[ Wed Sep 28 03:57:11 2022 ] 	Mean training loss: 0.5506. loss2: 0.0000. Mean training acc: 82.77%.
[ Wed Sep 28 03:57:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:57:11 2022 ] Eval epoch: 27
[ Wed Sep 28 03:57:45 2022 ] 	Mean test loss of 296 batches: 0.4145922949285926.
[ Wed Sep 28 03:57:45 2022 ] 	Top1: 87.05%
[ Wed Sep 28 03:57:45 2022 ] 	Top5: 98.45%
[ Wed Sep 28 03:57:45 2022 ] Training epoch: 28
[ Wed Sep 28 04:00:43 2022 ] 	Mean training loss: 0.5470. loss2: 0.0000. Mean training acc: 82.72%.
[ Wed Sep 28 04:00:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:00:43 2022 ] Eval epoch: 28
[ Wed Sep 28 04:01:17 2022 ] 	Mean test loss of 296 batches: 0.5054631130014723.
[ Wed Sep 28 04:01:17 2022 ] 	Top1: 83.73%
[ Wed Sep 28 04:01:17 2022 ] 	Top5: 98.17%
[ Wed Sep 28 04:01:17 2022 ] Training epoch: 29
[ Wed Sep 28 04:04:15 2022 ] 	Mean training loss: 0.5406. loss2: 0.0000. Mean training acc: 82.91%.
[ Wed Sep 28 04:04:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:04:15 2022 ] Eval epoch: 29
[ Wed Sep 28 04:04:48 2022 ] 	Mean test loss of 296 batches: 0.4777591517950232.
[ Wed Sep 28 04:04:48 2022 ] 	Top1: 84.85%
[ Wed Sep 28 04:04:49 2022 ] 	Top5: 98.36%
[ Wed Sep 28 04:04:49 2022 ] Training epoch: 30
[ Wed Sep 28 04:07:47 2022 ] 	Mean training loss: 0.5357. loss2: 0.0000. Mean training acc: 83.14%.
[ Wed Sep 28 04:07:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:07:47 2022 ] Eval epoch: 30
[ Wed Sep 28 04:08:20 2022 ] 	Mean test loss of 296 batches: 0.41886246536631844.
[ Wed Sep 28 04:08:20 2022 ] 	Top1: 86.57%
[ Wed Sep 28 04:08:20 2022 ] 	Top5: 98.48%
[ Wed Sep 28 04:08:20 2022 ] Training epoch: 31
[ Wed Sep 28 04:11:18 2022 ] 	Mean training loss: 0.5281. loss2: 0.0000. Mean training acc: 83.49%.
[ Wed Sep 28 04:11:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:11:18 2022 ] Eval epoch: 31
[ Wed Sep 28 04:11:52 2022 ] 	Mean test loss of 296 batches: 0.7104622851553801.
[ Wed Sep 28 04:11:52 2022 ] 	Top1: 78.78%
[ Wed Sep 28 04:11:52 2022 ] 	Top5: 96.91%
[ Wed Sep 28 04:11:52 2022 ] Training epoch: 32
[ Wed Sep 28 04:14:50 2022 ] 	Mean training loss: 0.5276. loss2: 0.0000. Mean training acc: 83.51%.
[ Wed Sep 28 04:14:50 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:14:50 2022 ] Eval epoch: 32
[ Wed Sep 28 04:15:24 2022 ] 	Mean test loss of 296 batches: 0.623906972541197.
[ Wed Sep 28 04:15:24 2022 ] 	Top1: 81.35%
[ Wed Sep 28 04:15:24 2022 ] 	Top5: 97.19%
[ Wed Sep 28 04:15:24 2022 ] Training epoch: 33
[ Wed Sep 28 04:18:23 2022 ] 	Mean training loss: 0.5313. loss2: 0.0000. Mean training acc: 83.30%.
[ Wed Sep 28 04:18:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:18:23 2022 ] Eval epoch: 33
[ Wed Sep 28 04:18:56 2022 ] 	Mean test loss of 296 batches: 0.39358079252210826.
[ Wed Sep 28 04:18:56 2022 ] 	Top1: 87.12%
[ Wed Sep 28 04:18:56 2022 ] 	Top5: 98.71%
[ Wed Sep 28 04:18:56 2022 ] Training epoch: 34
[ Wed Sep 28 04:21:55 2022 ] 	Mean training loss: 0.5219. loss2: 0.0000. Mean training acc: 83.79%.
[ Wed Sep 28 04:21:55 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:21:55 2022 ] Eval epoch: 34
[ Wed Sep 28 04:22:28 2022 ] 	Mean test loss of 296 batches: 0.381803355406265.
[ Wed Sep 28 04:22:28 2022 ] 	Top1: 87.60%
[ Wed Sep 28 04:22:28 2022 ] 	Top5: 98.63%
[ Wed Sep 28 04:22:28 2022 ] Training epoch: 35
[ Wed Sep 28 04:25:27 2022 ] 	Mean training loss: 0.5185. loss2: 0.0000. Mean training acc: 83.51%.
[ Wed Sep 28 04:25:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:25:27 2022 ] Eval epoch: 35
[ Wed Sep 28 04:26:00 2022 ] 	Mean test loss of 296 batches: 0.4121900897070363.
[ Wed Sep 28 04:26:00 2022 ] 	Top1: 86.72%
[ Wed Sep 28 04:26:00 2022 ] 	Top5: 98.49%
[ Wed Sep 28 04:26:00 2022 ] Training epoch: 36
[ Wed Sep 28 04:28:59 2022 ] 	Mean training loss: 0.5205. loss2: 0.0000. Mean training acc: 83.56%.
[ Wed Sep 28 04:28:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:28:59 2022 ] Eval epoch: 36
[ Wed Sep 28 04:29:32 2022 ] 	Mean test loss of 296 batches: 0.3652793498508431.
[ Wed Sep 28 04:29:32 2022 ] 	Top1: 87.94%
[ Wed Sep 28 04:29:32 2022 ] 	Top5: 98.91%
[ Wed Sep 28 04:29:32 2022 ] Training epoch: 37
[ Wed Sep 28 04:32:30 2022 ] 	Mean training loss: 0.5101. loss2: 0.0000. Mean training acc: 83.95%.
[ Wed Sep 28 04:32:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:32:30 2022 ] Eval epoch: 37
[ Wed Sep 28 04:33:04 2022 ] 	Mean test loss of 296 batches: 0.4262893927107389.
[ Wed Sep 28 04:33:04 2022 ] 	Top1: 86.51%
[ Wed Sep 28 04:33:04 2022 ] 	Top5: 98.63%
[ Wed Sep 28 04:33:04 2022 ] Training epoch: 38
[ Wed Sep 28 04:36:02 2022 ] 	Mean training loss: 0.5218. loss2: 0.0000. Mean training acc: 83.50%.
[ Wed Sep 28 04:36:02 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:36:02 2022 ] Eval epoch: 38
[ Wed Sep 28 04:36:35 2022 ] 	Mean test loss of 296 batches: 0.4591499952649748.
[ Wed Sep 28 04:36:35 2022 ] 	Top1: 85.29%
[ Wed Sep 28 04:36:35 2022 ] 	Top5: 98.43%
[ Wed Sep 28 04:36:35 2022 ] Training epoch: 39
[ Wed Sep 28 04:39:34 2022 ] 	Mean training loss: 0.5173. loss2: 0.0000. Mean training acc: 83.85%.
[ Wed Sep 28 04:39:34 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:39:34 2022 ] Eval epoch: 39
[ Wed Sep 28 04:40:07 2022 ] 	Mean test loss of 296 batches: 0.4006089527361296.
[ Wed Sep 28 04:40:07 2022 ] 	Top1: 87.32%
[ Wed Sep 28 04:40:07 2022 ] 	Top5: 98.58%
[ Wed Sep 28 04:40:07 2022 ] Training epoch: 40
[ Wed Sep 28 04:43:06 2022 ] 	Mean training loss: 0.5150. loss2: 0.0000. Mean training acc: 83.73%.
[ Wed Sep 28 04:43:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:43:06 2022 ] Eval epoch: 40
[ Wed Sep 28 04:43:39 2022 ] 	Mean test loss of 296 batches: 0.3854125382227672.
[ Wed Sep 28 04:43:39 2022 ] 	Top1: 87.44%
[ Wed Sep 28 04:43:39 2022 ] 	Top5: 98.68%
[ Wed Sep 28 04:43:39 2022 ] Training epoch: 41
[ Wed Sep 28 04:46:38 2022 ] 	Mean training loss: 0.5148. loss2: 0.0000. Mean training acc: 83.72%.
[ Wed Sep 28 04:46:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:46:38 2022 ] Eval epoch: 41
[ Wed Sep 28 04:47:11 2022 ] 	Mean test loss of 296 batches: 0.5845758400454715.
[ Wed Sep 28 04:47:11 2022 ] 	Top1: 81.95%
[ Wed Sep 28 04:47:11 2022 ] 	Top5: 97.02%
[ Wed Sep 28 04:47:11 2022 ] Training epoch: 42
[ Wed Sep 28 04:50:09 2022 ] 	Mean training loss: 0.5127. loss2: 0.0000. Mean training acc: 83.92%.
[ Wed Sep 28 04:50:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:50:09 2022 ] Eval epoch: 42
[ Wed Sep 28 04:50:43 2022 ] 	Mean test loss of 296 batches: 0.5423742706912595.
[ Wed Sep 28 04:50:43 2022 ] 	Top1: 83.17%
[ Wed Sep 28 04:50:43 2022 ] 	Top5: 97.69%
[ Wed Sep 28 04:50:43 2022 ] Training epoch: 43
[ Wed Sep 28 04:53:41 2022 ] 	Mean training loss: 0.5069. loss2: 0.0000. Mean training acc: 83.97%.
[ Wed Sep 28 04:53:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:53:41 2022 ] Eval epoch: 43
[ Wed Sep 28 04:54:14 2022 ] 	Mean test loss of 296 batches: 0.3997258969859497.
[ Wed Sep 28 04:54:14 2022 ] 	Top1: 87.15%
[ Wed Sep 28 04:54:14 2022 ] 	Top5: 98.61%
[ Wed Sep 28 04:54:14 2022 ] Training epoch: 44
[ Wed Sep 28 04:57:12 2022 ] 	Mean training loss: 0.5046. loss2: 0.0000. Mean training acc: 84.22%.
[ Wed Sep 28 04:57:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:57:13 2022 ] Eval epoch: 44
[ Wed Sep 28 04:57:46 2022 ] 	Mean test loss of 296 batches: 0.4032548716845545.
[ Wed Sep 28 04:57:46 2022 ] 	Top1: 87.21%
[ Wed Sep 28 04:57:46 2022 ] 	Top5: 98.44%
[ Wed Sep 28 04:57:46 2022 ] Training epoch: 45
[ Wed Sep 28 05:00:44 2022 ] 	Mean training loss: 0.5111. loss2: 0.0000. Mean training acc: 84.00%.
[ Wed Sep 28 05:00:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:00:44 2022 ] Eval epoch: 45
[ Wed Sep 28 05:01:18 2022 ] 	Mean test loss of 296 batches: 0.3763181641194466.
[ Wed Sep 28 05:01:18 2022 ] 	Top1: 87.90%
[ Wed Sep 28 05:01:18 2022 ] 	Top5: 98.58%
[ Wed Sep 28 05:01:18 2022 ] Training epoch: 46
[ Wed Sep 28 05:04:16 2022 ] 	Mean training loss: 0.5014. loss2: 0.0000. Mean training acc: 84.19%.
[ Wed Sep 28 05:04:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:04:16 2022 ] Eval epoch: 46
[ Wed Sep 28 05:04:50 2022 ] 	Mean test loss of 296 batches: 0.5029029176645988.
[ Wed Sep 28 05:04:50 2022 ] 	Top1: 84.65%
[ Wed Sep 28 05:04:50 2022 ] 	Top5: 97.52%
[ Wed Sep 28 05:04:50 2022 ] Training epoch: 47
[ Wed Sep 28 05:07:48 2022 ] 	Mean training loss: 0.5005. loss2: 0.0000. Mean training acc: 84.23%.
[ Wed Sep 28 05:07:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:07:48 2022 ] Eval epoch: 47
[ Wed Sep 28 05:08:21 2022 ] 	Mean test loss of 296 batches: 0.5527436691462189.
[ Wed Sep 28 05:08:21 2022 ] 	Top1: 82.85%
[ Wed Sep 28 05:08:22 2022 ] 	Top5: 97.94%
[ Wed Sep 28 05:08:22 2022 ] Training epoch: 48
[ Wed Sep 28 05:11:20 2022 ] 	Mean training loss: 0.5029. loss2: 0.0000. Mean training acc: 84.17%.
[ Wed Sep 28 05:11:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:11:20 2022 ] Eval epoch: 48
[ Wed Sep 28 05:11:53 2022 ] 	Mean test loss of 296 batches: 1.3953275046235807.
[ Wed Sep 28 05:11:53 2022 ] 	Top1: 66.68%
[ Wed Sep 28 05:11:53 2022 ] 	Top5: 89.36%
[ Wed Sep 28 05:11:53 2022 ] Training epoch: 49
[ Wed Sep 28 05:14:52 2022 ] 	Mean training loss: 0.5028. loss2: 0.0000. Mean training acc: 84.19%.
[ Wed Sep 28 05:14:52 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:14:52 2022 ] Eval epoch: 49
[ Wed Sep 28 05:15:25 2022 ] 	Mean test loss of 296 batches: 0.6426052272923894.
[ Wed Sep 28 05:15:25 2022 ] 	Top1: 79.97%
[ Wed Sep 28 05:15:25 2022 ] 	Top5: 97.01%
[ Wed Sep 28 05:15:25 2022 ] Training epoch: 50
[ Wed Sep 28 05:18:23 2022 ] 	Mean training loss: 0.5048. loss2: 0.0000. Mean training acc: 84.05%.
[ Wed Sep 28 05:18:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:18:23 2022 ] Eval epoch: 50
[ Wed Sep 28 05:18:57 2022 ] 	Mean test loss of 296 batches: 0.4674584588690384.
[ Wed Sep 28 05:18:57 2022 ] 	Top1: 85.15%
[ Wed Sep 28 05:18:57 2022 ] 	Top5: 98.21%
[ Wed Sep 28 05:18:57 2022 ] Training epoch: 51
[ Wed Sep 28 05:21:55 2022 ] 	Mean training loss: 0.4959. loss2: 0.0000. Mean training acc: 84.60%.
[ Wed Sep 28 05:21:55 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:21:55 2022 ] Eval epoch: 51
[ Wed Sep 28 05:22:28 2022 ] 	Mean test loss of 296 batches: 0.49145458210763093.
[ Wed Sep 28 05:22:29 2022 ] 	Top1: 84.42%
[ Wed Sep 28 05:22:29 2022 ] 	Top5: 98.43%
[ Wed Sep 28 05:22:29 2022 ] Training epoch: 52
[ Wed Sep 28 05:25:27 2022 ] 	Mean training loss: 0.5069. loss2: 0.0000. Mean training acc: 84.00%.
[ Wed Sep 28 05:25:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:25:27 2022 ] Eval epoch: 52
[ Wed Sep 28 05:26:00 2022 ] 	Mean test loss of 296 batches: 0.4000910732492402.
[ Wed Sep 28 05:26:00 2022 ] 	Top1: 87.35%
[ Wed Sep 28 05:26:01 2022 ] 	Top5: 98.65%
[ Wed Sep 28 05:26:01 2022 ] Training epoch: 53
[ Wed Sep 28 05:28:59 2022 ] 	Mean training loss: 0.4994. loss2: 0.0000. Mean training acc: 84.42%.
[ Wed Sep 28 05:28:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:28:59 2022 ] Eval epoch: 53
[ Wed Sep 28 05:29:32 2022 ] 	Mean test loss of 296 batches: 0.4223897319186378.
[ Wed Sep 28 05:29:32 2022 ] 	Top1: 86.81%
[ Wed Sep 28 05:29:32 2022 ] 	Top5: 98.29%
[ Wed Sep 28 05:29:32 2022 ] Training epoch: 54
[ Wed Sep 28 05:32:31 2022 ] 	Mean training loss: 0.4996. loss2: 0.0000. Mean training acc: 84.34%.
[ Wed Sep 28 05:32:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:32:31 2022 ] Eval epoch: 54
[ Wed Sep 28 05:33:05 2022 ] 	Mean test loss of 296 batches: 0.36190602445119135.
[ Wed Sep 28 05:33:05 2022 ] 	Top1: 88.59%
[ Wed Sep 28 05:33:05 2022 ] 	Top5: 98.73%
[ Wed Sep 28 05:33:05 2022 ] Training epoch: 55
[ Wed Sep 28 05:36:03 2022 ] 	Mean training loss: 0.4983. loss2: 0.0000. Mean training acc: 84.34%.
[ Wed Sep 28 05:36:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:36:03 2022 ] Eval epoch: 55
[ Wed Sep 28 05:36:36 2022 ] 	Mean test loss of 296 batches: 0.4693805384827224.
[ Wed Sep 28 05:36:37 2022 ] 	Top1: 85.54%
[ Wed Sep 28 05:36:37 2022 ] 	Top5: 98.07%
[ Wed Sep 28 05:36:37 2022 ] Training epoch: 56
[ Wed Sep 28 05:39:35 2022 ] 	Mean training loss: 0.4961. loss2: 0.0000. Mean training acc: 84.46%.
[ Wed Sep 28 05:39:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:39:35 2022 ] Eval epoch: 56
[ Wed Sep 28 05:40:08 2022 ] 	Mean test loss of 296 batches: 0.40966130637035175.
[ Wed Sep 28 05:40:08 2022 ] 	Top1: 86.75%
[ Wed Sep 28 05:40:09 2022 ] 	Top5: 98.40%
[ Wed Sep 28 05:40:09 2022 ] Training epoch: 57
[ Wed Sep 28 05:43:07 2022 ] 	Mean training loss: 0.4957. loss2: 0.0000. Mean training acc: 84.73%.
[ Wed Sep 28 05:43:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:43:07 2022 ] Eval epoch: 57
[ Wed Sep 28 05:43:40 2022 ] 	Mean test loss of 296 batches: 0.5695923853766274.
[ Wed Sep 28 05:43:40 2022 ] 	Top1: 82.75%
[ Wed Sep 28 05:43:40 2022 ] 	Top5: 98.16%
[ Wed Sep 28 05:43:40 2022 ] Training epoch: 58
[ Wed Sep 28 05:46:39 2022 ] 	Mean training loss: 0.4909. loss2: 0.0000. Mean training acc: 84.54%.
[ Wed Sep 28 05:46:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:46:39 2022 ] Eval epoch: 58
[ Wed Sep 28 05:47:12 2022 ] 	Mean test loss of 296 batches: 0.4807809555550685.
[ Wed Sep 28 05:47:12 2022 ] 	Top1: 85.47%
[ Wed Sep 28 05:47:12 2022 ] 	Top5: 98.12%
[ Wed Sep 28 05:47:12 2022 ] Training epoch: 59
[ Wed Sep 28 05:50:11 2022 ] 	Mean training loss: 0.4930. loss2: 0.0000. Mean training acc: 84.49%.
[ Wed Sep 28 05:50:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:50:11 2022 ] Eval epoch: 59
[ Wed Sep 28 05:50:45 2022 ] 	Mean test loss of 296 batches: 0.4622740339286424.
[ Wed Sep 28 05:50:45 2022 ] 	Top1: 85.55%
[ Wed Sep 28 05:50:45 2022 ] 	Top5: 98.34%
[ Wed Sep 28 05:50:45 2022 ] Training epoch: 60
[ Wed Sep 28 05:53:43 2022 ] 	Mean training loss: 0.4884. loss2: 0.0000. Mean training acc: 84.67%.
[ Wed Sep 28 05:53:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:53:43 2022 ] Eval epoch: 60
[ Wed Sep 28 05:54:17 2022 ] 	Mean test loss of 296 batches: 0.4297064590091641.
[ Wed Sep 28 05:54:17 2022 ] 	Top1: 86.03%
[ Wed Sep 28 05:54:17 2022 ] 	Top5: 98.60%
[ Wed Sep 28 05:54:17 2022 ] Training epoch: 61
[ Wed Sep 28 05:57:15 2022 ] 	Mean training loss: 0.5010. loss2: 0.0000. Mean training acc: 84.29%.
[ Wed Sep 28 05:57:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:57:15 2022 ] Eval epoch: 61
[ Wed Sep 28 05:57:49 2022 ] 	Mean test loss of 296 batches: 0.4463072972925934.
[ Wed Sep 28 05:57:49 2022 ] 	Top1: 86.38%
[ Wed Sep 28 05:57:49 2022 ] 	Top5: 98.20%
[ Wed Sep 28 05:57:49 2022 ] Training epoch: 62
[ Wed Sep 28 06:00:47 2022 ] 	Mean training loss: 0.4950. loss2: 0.0000. Mean training acc: 84.35%.
[ Wed Sep 28 06:00:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:00:47 2022 ] Eval epoch: 62
[ Wed Sep 28 06:01:20 2022 ] 	Mean test loss of 296 batches: 0.4471985259772958.
[ Wed Sep 28 06:01:21 2022 ] 	Top1: 86.13%
[ Wed Sep 28 06:01:21 2022 ] 	Top5: 98.16%
[ Wed Sep 28 06:01:21 2022 ] Training epoch: 63
[ Wed Sep 28 06:04:19 2022 ] 	Mean training loss: 0.4962. loss2: 0.0000. Mean training acc: 84.34%.
[ Wed Sep 28 06:04:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:04:19 2022 ] Eval epoch: 63
[ Wed Sep 28 06:04:52 2022 ] 	Mean test loss of 296 batches: 0.41593813025266735.
[ Wed Sep 28 06:04:53 2022 ] 	Top1: 87.38%
[ Wed Sep 28 06:04:53 2022 ] 	Top5: 98.33%
[ Wed Sep 28 06:04:53 2022 ] Training epoch: 64
[ Wed Sep 28 06:07:51 2022 ] 	Mean training loss: 0.4917. loss2: 0.0000. Mean training acc: 84.42%.
[ Wed Sep 28 06:07:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:07:51 2022 ] Eval epoch: 64
[ Wed Sep 28 06:08:24 2022 ] 	Mean test loss of 296 batches: 0.46809551658461224.
[ Wed Sep 28 06:08:24 2022 ] 	Top1: 85.74%
[ Wed Sep 28 06:08:25 2022 ] 	Top5: 98.21%
[ Wed Sep 28 06:08:25 2022 ] Training epoch: 65
[ Wed Sep 28 06:11:23 2022 ] 	Mean training loss: 0.4918. loss2: 0.0000. Mean training acc: 84.82%.
[ Wed Sep 28 06:11:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:11:23 2022 ] Eval epoch: 65
[ Wed Sep 28 06:11:56 2022 ] 	Mean test loss of 296 batches: 0.42310909707904665.
[ Wed Sep 28 06:11:56 2022 ] 	Top1: 86.55%
[ Wed Sep 28 06:11:56 2022 ] 	Top5: 98.82%
[ Wed Sep 28 06:11:56 2022 ] Training epoch: 66
[ Wed Sep 28 06:14:55 2022 ] 	Mean training loss: 0.4867. loss2: 0.0000. Mean training acc: 84.97%.
[ Wed Sep 28 06:14:55 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:14:55 2022 ] Eval epoch: 66
[ Wed Sep 28 06:15:28 2022 ] 	Mean test loss of 296 batches: 0.40364615324683284.
[ Wed Sep 28 06:15:28 2022 ] 	Top1: 87.03%
[ Wed Sep 28 06:15:28 2022 ] 	Top5: 98.53%
[ Wed Sep 28 06:15:28 2022 ] Training epoch: 67
[ Wed Sep 28 06:18:26 2022 ] 	Mean training loss: 0.4813. loss2: 0.0000. Mean training acc: 85.01%.
[ Wed Sep 28 06:18:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:18:26 2022 ] Eval epoch: 67
[ Wed Sep 28 06:19:00 2022 ] 	Mean test loss of 296 batches: 0.4172418094617692.
[ Wed Sep 28 06:19:00 2022 ] 	Top1: 86.73%
[ Wed Sep 28 06:19:00 2022 ] 	Top5: 98.55%
[ Wed Sep 28 06:19:00 2022 ] Training epoch: 68
[ Wed Sep 28 06:21:58 2022 ] 	Mean training loss: 0.4949. loss2: 0.0000. Mean training acc: 84.37%.
[ Wed Sep 28 06:21:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:21:58 2022 ] Eval epoch: 68
[ Wed Sep 28 06:22:32 2022 ] 	Mean test loss of 296 batches: 0.38557999047475894.
[ Wed Sep 28 06:22:32 2022 ] 	Top1: 88.10%
[ Wed Sep 28 06:22:32 2022 ] 	Top5: 98.68%
[ Wed Sep 28 06:22:32 2022 ] Training epoch: 69
[ Wed Sep 28 06:25:31 2022 ] 	Mean training loss: 0.4895. loss2: 0.0000. Mean training acc: 84.45%.
[ Wed Sep 28 06:25:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:25:31 2022 ] Eval epoch: 69
[ Wed Sep 28 06:26:05 2022 ] 	Mean test loss of 296 batches: 0.38763807681263296.
[ Wed Sep 28 06:26:05 2022 ] 	Top1: 87.79%
[ Wed Sep 28 06:26:05 2022 ] 	Top5: 98.51%
[ Wed Sep 28 06:26:05 2022 ] Training epoch: 70
[ Wed Sep 28 06:29:03 2022 ] 	Mean training loss: 0.4879. loss2: 0.0000. Mean training acc: 84.66%.
[ Wed Sep 28 06:29:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:29:03 2022 ] Eval epoch: 70
[ Wed Sep 28 06:29:37 2022 ] 	Mean test loss of 296 batches: 0.5126646757125854.
[ Wed Sep 28 06:29:37 2022 ] 	Top1: 83.56%
[ Wed Sep 28 06:29:37 2022 ] 	Top5: 98.08%
[ Wed Sep 28 06:29:37 2022 ] Training epoch: 71
[ Wed Sep 28 06:32:35 2022 ] 	Mean training loss: 0.4899. loss2: 0.0000. Mean training acc: 84.51%.
[ Wed Sep 28 06:32:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:32:35 2022 ] Eval epoch: 71
[ Wed Sep 28 06:33:09 2022 ] 	Mean test loss of 296 batches: 0.43562824540847056.
[ Wed Sep 28 06:33:09 2022 ] 	Top1: 86.85%
[ Wed Sep 28 06:33:09 2022 ] 	Top5: 98.35%
[ Wed Sep 28 06:33:09 2022 ] Training epoch: 72
[ Wed Sep 28 06:36:07 2022 ] 	Mean training loss: 0.4942. loss2: 0.0000. Mean training acc: 84.48%.
[ Wed Sep 28 06:36:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:36:07 2022 ] Eval epoch: 72
[ Wed Sep 28 06:36:41 2022 ] 	Mean test loss of 296 batches: 0.39736295373153846.
[ Wed Sep 28 06:36:41 2022 ] 	Top1: 87.50%
[ Wed Sep 28 06:36:41 2022 ] 	Top5: 98.40%
[ Wed Sep 28 06:36:41 2022 ] Training epoch: 73
[ Wed Sep 28 06:39:39 2022 ] 	Mean training loss: 0.4898. loss2: 0.0000. Mean training acc: 84.58%.
[ Wed Sep 28 06:39:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:39:39 2022 ] Eval epoch: 73
[ Wed Sep 28 06:40:12 2022 ] 	Mean test loss of 296 batches: 0.39663898454022567.
[ Wed Sep 28 06:40:13 2022 ] 	Top1: 87.01%
[ Wed Sep 28 06:40:13 2022 ] 	Top5: 98.58%
[ Wed Sep 28 06:40:13 2022 ] Training epoch: 74
[ Wed Sep 28 06:43:11 2022 ] 	Mean training loss: 0.4795. loss2: 0.0000. Mean training acc: 84.90%.
[ Wed Sep 28 06:43:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:43:11 2022 ] Eval epoch: 74
[ Wed Sep 28 06:43:44 2022 ] 	Mean test loss of 296 batches: 0.5124697875533555.
[ Wed Sep 28 06:43:44 2022 ] 	Top1: 84.26%
[ Wed Sep 28 06:43:44 2022 ] 	Top5: 98.34%
[ Wed Sep 28 06:43:44 2022 ] Training epoch: 75
[ Wed Sep 28 06:46:42 2022 ] 	Mean training loss: 0.4874. loss2: 0.0000. Mean training acc: 84.61%.
[ Wed Sep 28 06:46:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:46:42 2022 ] Eval epoch: 75
[ Wed Sep 28 06:47:16 2022 ] 	Mean test loss of 296 batches: 0.4333131778985262.
[ Wed Sep 28 06:47:16 2022 ] 	Top1: 86.21%
[ Wed Sep 28 06:47:16 2022 ] 	Top5: 98.71%
[ Wed Sep 28 06:47:16 2022 ] Training epoch: 76
[ Wed Sep 28 06:50:14 2022 ] 	Mean training loss: 0.4874. loss2: 0.0000. Mean training acc: 84.82%.
[ Wed Sep 28 06:50:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:50:14 2022 ] Eval epoch: 76
[ Wed Sep 28 06:50:47 2022 ] 	Mean test loss of 296 batches: 0.41342469465893666.
[ Wed Sep 28 06:50:47 2022 ] 	Top1: 86.78%
[ Wed Sep 28 06:50:47 2022 ] 	Top5: 98.70%
[ Wed Sep 28 06:50:47 2022 ] Training epoch: 77
[ Wed Sep 28 06:53:46 2022 ] 	Mean training loss: 0.4821. loss2: 0.0000. Mean training acc: 84.89%.
[ Wed Sep 28 06:53:46 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:53:46 2022 ] Eval epoch: 77
[ Wed Sep 28 06:54:19 2022 ] 	Mean test loss of 296 batches: 0.4109548421425594.
[ Wed Sep 28 06:54:19 2022 ] 	Top1: 86.93%
[ Wed Sep 28 06:54:19 2022 ] 	Top5: 98.46%
[ Wed Sep 28 06:54:19 2022 ] Training epoch: 78
[ Wed Sep 28 06:57:17 2022 ] 	Mean training loss: 0.4847. loss2: 0.0000. Mean training acc: 84.60%.
[ Wed Sep 28 06:57:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:57:17 2022 ] Eval epoch: 78
[ Wed Sep 28 06:57:51 2022 ] 	Mean test loss of 296 batches: 0.4371535089873784.
[ Wed Sep 28 06:57:51 2022 ] 	Top1: 86.04%
[ Wed Sep 28 06:57:51 2022 ] 	Top5: 98.39%
[ Wed Sep 28 06:57:51 2022 ] Training epoch: 79
[ Wed Sep 28 07:00:49 2022 ] 	Mean training loss: 0.4828. loss2: 0.0000. Mean training acc: 84.76%.
[ Wed Sep 28 07:00:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:00:49 2022 ] Eval epoch: 79
[ Wed Sep 28 07:01:22 2022 ] 	Mean test loss of 296 batches: 0.46074171568191535.
[ Wed Sep 28 07:01:22 2022 ] 	Top1: 85.34%
[ Wed Sep 28 07:01:23 2022 ] 	Top5: 98.26%
[ Wed Sep 28 07:01:23 2022 ] Training epoch: 80
[ Wed Sep 28 07:04:21 2022 ] 	Mean training loss: 0.4840. loss2: 0.0000. Mean training acc: 84.86%.
[ Wed Sep 28 07:04:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:04:21 2022 ] Eval epoch: 80
[ Wed Sep 28 07:04:54 2022 ] 	Mean test loss of 296 batches: 0.4435298127197736.
[ Wed Sep 28 07:04:54 2022 ] 	Top1: 86.09%
[ Wed Sep 28 07:04:54 2022 ] 	Top5: 98.34%
[ Wed Sep 28 07:04:54 2022 ] Training epoch: 81
[ Wed Sep 28 07:07:53 2022 ] 	Mean training loss: 0.4807. loss2: 0.0000. Mean training acc: 84.89%.
[ Wed Sep 28 07:07:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:07:53 2022 ] Eval epoch: 81
[ Wed Sep 28 07:08:26 2022 ] 	Mean test loss of 296 batches: 0.521737318536317.
[ Wed Sep 28 07:08:26 2022 ] 	Top1: 83.72%
[ Wed Sep 28 07:08:26 2022 ] 	Top5: 97.54%
[ Wed Sep 28 07:08:26 2022 ] Training epoch: 82
[ Wed Sep 28 07:11:25 2022 ] 	Mean training loss: 0.4857. loss2: 0.0000. Mean training acc: 84.65%.
[ Wed Sep 28 07:11:25 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:11:25 2022 ] Eval epoch: 82
[ Wed Sep 28 07:11:58 2022 ] 	Mean test loss of 296 batches: 0.36607772110634157.
[ Wed Sep 28 07:11:58 2022 ] 	Top1: 88.31%
[ Wed Sep 28 07:11:58 2022 ] 	Top5: 98.74%
[ Wed Sep 28 07:11:58 2022 ] Training epoch: 83
[ Wed Sep 28 07:14:56 2022 ] 	Mean training loss: 0.4865. loss2: 0.0000. Mean training acc: 84.69%.
[ Wed Sep 28 07:14:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:14:56 2022 ] Eval epoch: 83
[ Wed Sep 28 07:15:30 2022 ] 	Mean test loss of 296 batches: 0.3980946048490099.
[ Wed Sep 28 07:15:30 2022 ] 	Top1: 87.19%
[ Wed Sep 28 07:15:30 2022 ] 	Top5: 98.61%
[ Wed Sep 28 07:15:30 2022 ] Training epoch: 84
[ Wed Sep 28 07:18:28 2022 ] 	Mean training loss: 0.4865. loss2: 0.0000. Mean training acc: 84.71%.
[ Wed Sep 28 07:18:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:18:28 2022 ] Eval epoch: 84
[ Wed Sep 28 07:19:02 2022 ] 	Mean test loss of 296 batches: 0.45528030133730657.
[ Wed Sep 28 07:19:02 2022 ] 	Top1: 85.57%
[ Wed Sep 28 07:19:02 2022 ] 	Top5: 98.30%
[ Wed Sep 28 07:19:02 2022 ] Training epoch: 85
[ Wed Sep 28 07:22:00 2022 ] 	Mean training loss: 0.4823. loss2: 0.0000. Mean training acc: 84.89%.
[ Wed Sep 28 07:22:00 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:22:00 2022 ] Eval epoch: 85
[ Wed Sep 28 07:22:33 2022 ] 	Mean test loss of 296 batches: 0.541466766011876.
[ Wed Sep 28 07:22:34 2022 ] 	Top1: 82.74%
[ Wed Sep 28 07:22:34 2022 ] 	Top5: 97.91%
[ Wed Sep 28 07:22:34 2022 ] Training epoch: 86
[ Wed Sep 28 07:25:32 2022 ] 	Mean training loss: 0.4839. loss2: 0.0000. Mean training acc: 84.69%.
[ Wed Sep 28 07:25:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:25:32 2022 ] Eval epoch: 86
[ Wed Sep 28 07:26:06 2022 ] 	Mean test loss of 296 batches: 0.44475694791086623.
[ Wed Sep 28 07:26:06 2022 ] 	Top1: 86.11%
[ Wed Sep 28 07:26:06 2022 ] 	Top5: 98.42%
[ Wed Sep 28 07:26:06 2022 ] Training epoch: 87
[ Wed Sep 28 07:29:04 2022 ] 	Mean training loss: 0.4834. loss2: 0.0000. Mean training acc: 84.84%.
[ Wed Sep 28 07:29:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:29:04 2022 ] Eval epoch: 87
[ Wed Sep 28 07:29:37 2022 ] 	Mean test loss of 296 batches: 0.4265095559326378.
[ Wed Sep 28 07:29:37 2022 ] 	Top1: 86.40%
[ Wed Sep 28 07:29:37 2022 ] 	Top5: 98.41%
[ Wed Sep 28 07:29:37 2022 ] Training epoch: 88
[ Wed Sep 28 07:32:36 2022 ] 	Mean training loss: 0.4876. loss2: 0.0000. Mean training acc: 84.64%.
[ Wed Sep 28 07:32:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:32:36 2022 ] Eval epoch: 88
[ Wed Sep 28 07:33:09 2022 ] 	Mean test loss of 296 batches: 0.46556379784502694.
[ Wed Sep 28 07:33:09 2022 ] 	Top1: 85.15%
[ Wed Sep 28 07:33:09 2022 ] 	Top5: 98.44%
[ Wed Sep 28 07:33:09 2022 ] Training epoch: 89
[ Wed Sep 28 07:36:07 2022 ] 	Mean training loss: 0.4850. loss2: 0.0000. Mean training acc: 84.72%.
[ Wed Sep 28 07:36:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:36:08 2022 ] Eval epoch: 89
[ Wed Sep 28 07:36:41 2022 ] 	Mean test loss of 296 batches: 0.42005363972605886.
[ Wed Sep 28 07:36:41 2022 ] 	Top1: 86.24%
[ Wed Sep 28 07:36:41 2022 ] 	Top5: 98.62%
[ Wed Sep 28 07:36:41 2022 ] Training epoch: 90
[ Wed Sep 28 07:39:39 2022 ] 	Mean training loss: 0.4807. loss2: 0.0000. Mean training acc: 84.91%.
[ Wed Sep 28 07:39:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:39:39 2022 ] Eval epoch: 90
[ Wed Sep 28 07:40:13 2022 ] 	Mean test loss of 296 batches: 0.45148580307392655.
[ Wed Sep 28 07:40:13 2022 ] 	Top1: 85.59%
[ Wed Sep 28 07:40:13 2022 ] 	Top5: 98.58%
[ Wed Sep 28 07:40:13 2022 ] Training epoch: 91
[ Wed Sep 28 07:43:11 2022 ] 	Mean training loss: 0.2827. loss2: 0.0000. Mean training acc: 91.24%.
[ Wed Sep 28 07:43:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:43:11 2022 ] Eval epoch: 91
[ Wed Sep 28 07:43:45 2022 ] 	Mean test loss of 296 batches: 0.18310122691862588.
[ Wed Sep 28 07:43:45 2022 ] 	Top1: 94.33%
[ Wed Sep 28 07:43:45 2022 ] 	Top5: 99.33%
[ Wed Sep 28 07:43:45 2022 ] Training epoch: 92
[ Wed Sep 28 07:46:43 2022 ] 	Mean training loss: 0.2264. loss2: 0.0000. Mean training acc: 93.03%.
[ Wed Sep 28 07:46:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:46:43 2022 ] Eval epoch: 92
[ Wed Sep 28 07:47:17 2022 ] 	Mean test loss of 296 batches: 0.174626910045894.
[ Wed Sep 28 07:47:17 2022 ] 	Top1: 94.50%
[ Wed Sep 28 07:47:17 2022 ] 	Top5: 99.36%
[ Wed Sep 28 07:47:17 2022 ] Training epoch: 93
[ Wed Sep 28 07:50:16 2022 ] 	Mean training loss: 0.2000. loss2: 0.0000. Mean training acc: 93.87%.
[ Wed Sep 28 07:50:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:50:16 2022 ] Eval epoch: 93
[ Wed Sep 28 07:50:50 2022 ] 	Mean test loss of 296 batches: 0.1708771691380723.
[ Wed Sep 28 07:50:50 2022 ] 	Top1: 94.67%
[ Wed Sep 28 07:50:50 2022 ] 	Top5: 99.40%
[ Wed Sep 28 07:50:50 2022 ] Training epoch: 94
[ Wed Sep 28 07:53:50 2022 ] 	Mean training loss: 0.1826. loss2: 0.0000. Mean training acc: 94.34%.
[ Wed Sep 28 07:53:50 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:53:50 2022 ] Eval epoch: 94
[ Wed Sep 28 07:54:23 2022 ] 	Mean test loss of 296 batches: 0.1637177661036116.
[ Wed Sep 28 07:54:23 2022 ] 	Top1: 94.93%
[ Wed Sep 28 07:54:23 2022 ] 	Top5: 99.45%
[ Wed Sep 28 07:54:23 2022 ] Training epoch: 95
[ Wed Sep 28 07:57:21 2022 ] 	Mean training loss: 0.1667. loss2: 0.0000. Mean training acc: 94.89%.
[ Wed Sep 28 07:57:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:57:21 2022 ] Eval epoch: 95
[ Wed Sep 28 07:57:55 2022 ] 	Mean test loss of 296 batches: 0.16510488801459605.
[ Wed Sep 28 07:57:55 2022 ] 	Top1: 94.74%
[ Wed Sep 28 07:57:55 2022 ] 	Top5: 99.40%
[ Wed Sep 28 07:57:55 2022 ] Training epoch: 96
[ Wed Sep 28 08:00:53 2022 ] 	Mean training loss: 0.1534. loss2: 0.0000. Mean training acc: 95.40%.
[ Wed Sep 28 08:00:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:00:53 2022 ] Eval epoch: 96
[ Wed Sep 28 08:01:27 2022 ] 	Mean test loss of 296 batches: 0.16107286044673338.
[ Wed Sep 28 08:01:27 2022 ] 	Top1: 95.11%
[ Wed Sep 28 08:01:27 2022 ] 	Top5: 99.40%
[ Wed Sep 28 08:01:27 2022 ] Training epoch: 97
[ Wed Sep 28 08:04:26 2022 ] 	Mean training loss: 0.1434. loss2: 0.0000. Mean training acc: 95.68%.
[ Wed Sep 28 08:04:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:04:26 2022 ] Eval epoch: 97
[ Wed Sep 28 08:04:59 2022 ] 	Mean test loss of 296 batches: 0.1661876848225507.
[ Wed Sep 28 08:04:59 2022 ] 	Top1: 94.96%
[ Wed Sep 28 08:04:59 2022 ] 	Top5: 99.38%
[ Wed Sep 28 08:04:59 2022 ] Training epoch: 98
[ Wed Sep 28 08:07:58 2022 ] 	Mean training loss: 0.1372. loss2: 0.0000. Mean training acc: 95.95%.
[ Wed Sep 28 08:07:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:07:58 2022 ] Eval epoch: 98
[ Wed Sep 28 08:08:31 2022 ] 	Mean test loss of 296 batches: 0.16107087105990867.
[ Wed Sep 28 08:08:31 2022 ] 	Top1: 95.05%
[ Wed Sep 28 08:08:31 2022 ] 	Top5: 99.39%
[ Wed Sep 28 08:08:31 2022 ] Training epoch: 99
[ Wed Sep 28 08:11:29 2022 ] 	Mean training loss: 0.1279. loss2: 0.0000. Mean training acc: 96.36%.
[ Wed Sep 28 08:11:29 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:11:29 2022 ] Eval epoch: 99
[ Wed Sep 28 08:12:03 2022 ] 	Mean test loss of 296 batches: 0.1715007544702825.
[ Wed Sep 28 08:12:03 2022 ] 	Top1: 94.89%
[ Wed Sep 28 08:12:03 2022 ] 	Top5: 99.39%
[ Wed Sep 28 08:12:03 2022 ] Training epoch: 100
[ Wed Sep 28 08:15:01 2022 ] 	Mean training loss: 0.1244. loss2: 0.0000. Mean training acc: 96.41%.
[ Wed Sep 28 08:15:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:15:01 2022 ] Eval epoch: 100
[ Wed Sep 28 08:15:35 2022 ] 	Mean test loss of 296 batches: 0.1689377749068159.
[ Wed Sep 28 08:15:35 2022 ] 	Top1: 94.80%
[ Wed Sep 28 08:15:35 2022 ] 	Top5: 99.29%
[ Wed Sep 28 08:15:35 2022 ] Training epoch: 101
[ Wed Sep 28 08:18:33 2022 ] 	Mean training loss: 0.0959. loss2: 0.0000. Mean training acc: 97.52%.
[ Wed Sep 28 08:18:33 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:18:33 2022 ] Eval epoch: 101
[ Wed Sep 28 08:19:06 2022 ] 	Mean test loss of 296 batches: 0.15717439618940163.
[ Wed Sep 28 08:19:06 2022 ] 	Top1: 95.36%
[ Wed Sep 28 08:19:07 2022 ] 	Top5: 99.35%
[ Wed Sep 28 08:19:07 2022 ] Training epoch: 102
[ Wed Sep 28 08:22:05 2022 ] 	Mean training loss: 0.0889. loss2: 0.0000. Mean training acc: 97.72%.
[ Wed Sep 28 08:22:05 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:22:05 2022 ] Eval epoch: 102
[ Wed Sep 28 08:22:38 2022 ] 	Mean test loss of 296 batches: 0.15969644245892964.
[ Wed Sep 28 08:22:38 2022 ] 	Top1: 95.28%
[ Wed Sep 28 08:22:38 2022 ] 	Top5: 99.33%
[ Wed Sep 28 08:22:38 2022 ] Training epoch: 103
[ Wed Sep 28 08:25:36 2022 ] 	Mean training loss: 0.0836. loss2: 0.0000. Mean training acc: 97.90%.
[ Wed Sep 28 08:25:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:25:37 2022 ] Eval epoch: 103
[ Wed Sep 28 08:26:10 2022 ] 	Mean test loss of 296 batches: 0.15773722676818283.
[ Wed Sep 28 08:26:10 2022 ] 	Top1: 95.45%
[ Wed Sep 28 08:26:10 2022 ] 	Top5: 99.37%
[ Wed Sep 28 08:26:10 2022 ] Training epoch: 104
[ Wed Sep 28 08:29:08 2022 ] 	Mean training loss: 0.0794. loss2: 0.0000. Mean training acc: 97.99%.
[ Wed Sep 28 08:29:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:29:08 2022 ] Eval epoch: 104
[ Wed Sep 28 08:29:42 2022 ] 	Mean test loss of 296 batches: 0.15635588087418392.
[ Wed Sep 28 08:29:42 2022 ] 	Top1: 95.46%
[ Wed Sep 28 08:29:42 2022 ] 	Top5: 99.37%
[ Wed Sep 28 08:29:42 2022 ] Training epoch: 105
[ Wed Sep 28 08:32:40 2022 ] 	Mean training loss: 0.0783. loss2: 0.0000. Mean training acc: 98.04%.
[ Wed Sep 28 08:32:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:32:40 2022 ] Eval epoch: 105
[ Wed Sep 28 08:33:14 2022 ] 	Mean test loss of 296 batches: 0.1586542977437037.
[ Wed Sep 28 08:33:14 2022 ] 	Top1: 95.37%
[ Wed Sep 28 08:33:14 2022 ] 	Top5: 99.33%
[ Wed Sep 28 08:33:14 2022 ] Training epoch: 106
[ Wed Sep 28 08:36:12 2022 ] 	Mean training loss: 0.0736. loss2: 0.0000. Mean training acc: 98.21%.
[ Wed Sep 28 08:36:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:36:12 2022 ] Eval epoch: 106
[ Wed Sep 28 08:36:46 2022 ] 	Mean test loss of 296 batches: 0.15761841128492174.
[ Wed Sep 28 08:36:46 2022 ] 	Top1: 95.42%
[ Wed Sep 28 08:36:46 2022 ] 	Top5: 99.36%
[ Wed Sep 28 08:36:46 2022 ] Training epoch: 107
[ Wed Sep 28 08:39:45 2022 ] 	Mean training loss: 0.0725. loss2: 0.0000. Mean training acc: 98.24%.
[ Wed Sep 28 08:39:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:39:45 2022 ] Eval epoch: 107
[ Wed Sep 28 08:40:18 2022 ] 	Mean test loss of 296 batches: 0.15943847506030187.
[ Wed Sep 28 08:40:18 2022 ] 	Top1: 95.39%
[ Wed Sep 28 08:40:18 2022 ] 	Top5: 99.32%
[ Wed Sep 28 08:40:18 2022 ] Training epoch: 108
[ Wed Sep 28 08:43:16 2022 ] 	Mean training loss: 0.0724. loss2: 0.0000. Mean training acc: 98.12%.
[ Wed Sep 28 08:43:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:43:17 2022 ] Eval epoch: 108
[ Wed Sep 28 08:43:50 2022 ] 	Mean test loss of 296 batches: 0.15897099910039655.
[ Wed Sep 28 08:43:50 2022 ] 	Top1: 95.42%
[ Wed Sep 28 08:43:50 2022 ] 	Top5: 99.35%
[ Wed Sep 28 08:43:50 2022 ] Training epoch: 109
[ Wed Sep 28 08:46:48 2022 ] 	Mean training loss: 0.0695. loss2: 0.0000. Mean training acc: 98.34%.
[ Wed Sep 28 08:46:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:46:48 2022 ] Eval epoch: 109
[ Wed Sep 28 08:47:22 2022 ] 	Mean test loss of 296 batches: 0.15975180794044422.
[ Wed Sep 28 08:47:22 2022 ] 	Top1: 95.47%
[ Wed Sep 28 08:47:22 2022 ] 	Top5: 99.31%
[ Wed Sep 28 08:47:22 2022 ] Training epoch: 110
[ Wed Sep 28 08:50:20 2022 ] 	Mean training loss: 0.0686. loss2: 0.0000. Mean training acc: 98.39%.
[ Wed Sep 28 08:50:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:50:20 2022 ] Eval epoch: 110
[ Wed Sep 28 08:50:53 2022 ] 	Mean test loss of 296 batches: 0.1605704743194872.
[ Wed Sep 28 08:50:53 2022 ] 	Top1: 95.55%
[ Wed Sep 28 08:50:53 2022 ] 	Top5: 99.33%
[ Wed Sep 28 08:51:26 2022 ] Best accuracy: 0.9554722163532643
[ Wed Sep 28 08:51:26 2022 ] Epoch number: 110
[ Wed Sep 28 08:51:26 2022 ] Model name: work_dir/ntu60/cview/fc_bone
[ Wed Sep 28 08:51:26 2022 ] Model total number of params: 2082097
[ Wed Sep 28 08:51:26 2022 ] Weight decay: 0.0004
[ Wed Sep 28 08:51:26 2022 ] Base LR: 0.1
[ Wed Sep 28 08:51:26 2022 ] Batch Size: 64
[ Wed Sep 28 08:51:26 2022 ] Test Batch Size: 64
[ Wed Sep 28 08:51:26 2022 ] seed: 1
